Ab>initio, Big Data, Informatica, Tableau, Data Architect, Cognos, MicroStrategy, Healthcare Business Analysts, Cloud, etc.
Posted by Dhaval Upadhyay
1 - 15 yrs
₹5L - ₹10L / yr
Pune, Chicago, Hyderabad, New York
Skills
Ab Initio
Cognos
Microstrategy
Business Analysts
Hadoop
Informatica PowerCenter
Tableau
Exusia, Inc. (ex-OO-see-ah: translated from Greek to mean "Immensely Powerful and Agile") was founded with the objective of addressing a growing gap in the data innovation and engineering space as the next global leader in big data, analytics, data integration, and cloud computing solutions. Exusia is a multinational, delivery-centric firm that provides consulting and software as a service (SaaS) solutions to leading financial, government, healthcare, telecommunications, and high technology organizations facing the largest data volumes and the most complex information management requirements.

Exusia was founded in the United States in 2012, with headquarters in New York City and regional US offices in Chicago, Atlanta, and Los Angeles. Exusia’s international presence continues to expand and is driven from Toronto (Canada), Sao Paulo (Brazil), Johannesburg (South Africa), and Pune (India).

Our mission is to empower clients to grow revenue, optimize costs, and satisfy regulatory requirements through the innovative use of information and analytics. We leverage a unique blend of strategy, intellectual property, technical execution, and outsourcing to enable our clients to achieve significant returns on investment for their business, data, and technology initiatives.

At the core of our philosophy is a quality-first, trust-building, delivery-focused client relationship. The foundation of this relationship is the talent of our team. By recruiting and retaining the best talent in the industry, we are able to deliver to clients, whose data volumes and requirements number among the largest in the world, a broad range of customized, cutting-edge solutions.
Users love Cutshort
Read about what our users have to say about finding their next opportunity on Cutshort.
Shubham Vishwakarma

Full Stack Developer - Averlon
I had an amazing experience. It was a delight getting interviewed via Cutshort. The entire end to end process was amazing. I would like to mention Reshika, she was just amazing wrt guiding me through the process. Thank you team.

About Exusia

Founded: 2012
Type: Services
Size: 100-1000
Stage: Profitable

About

Exusia is a multinational firm that provides consulting and software as a service solutions to leading organizations in the healthcare, finance, telecommunications, consumer products, hospitality, supply chain, and high technology industries. It addresses the growing gap in the strategy and data engineering space as the next global leader in analytics, data engineering, and cloud computing solutions. Exusia is ISO 27001 certified and offers managed services to organizations facing the largest data volumes and the most complex data engineering requirements.

Exusia was founded in 2012 in New York City and has its Americas headquarters in Miami, European headquarters in London, Africa headquarters in Johannesburg, and Asia headquarters in Pune. It has delivery centers in Pune, Gurugram, Chennai, Hyderabad, and Bangalore.

Connect with the team

Dhaval Upadhyay

Company social profiles

LinkedIn · Twitter · Facebook

Similar jobs

Solix Technologies
3 recruiters
Sumathi Arramraju
Posted by Sumathi Arramraju
Hyderabad
3 - 7 yrs
₹6L - ₹12L / yr
Hadoop
Java
HDFS
Spring
Spark
+1 more
Primary skills required: Java, J2EE, JSP, Servlets, JDBC, Tomcat, Hadoop (HDFS, MapReduce, Hive, HBase, Spark, Impala)
Secondary skills: streaming, archiving, AWS/Azure/cloud

Role:
• Strong programming and support experience in Java and J2EE technologies
• Good experience in Core Java, JSP, Servlets, JDBC
• Good exposure to Hadoop development (HDFS, MapReduce, Hive, HBase, Spark)
• 2+ years of Java experience and 1+ years of experience in Hadoop
• Good communication skills
• Web services or Elastic MapReduce
• Familiarity with data-loading tools such as Sqoop
• Good to know: Spark, Storm, Apache HBase
Wissen Technology
4 recruiters
Sukanya Mohan
Posted by Sukanya Mohan
Bengaluru (Bangalore)
3 - 5 yrs
Best in industry
Python
Apache Spark
Hadoop
SQL

Responsibilities:
• Build customer-facing solutions for the Data Observability product to monitor data pipelines
• Work on POCs to build new data pipeline monitoring capabilities
• Build next-generation scalable, reliable, flexible, high-performance data pipeline capabilities for ingesting data from multiple sources containing complex datasets
• Continuously improve the services you own, making them more performant and utilising resources in the most optimised way
• Collaborate closely with engineering, data science, and product teams to propose an optimal solution for a given problem statement
• Work closely with the DevOps team on performance monitoring and MLOps

Required Skills:
• 3+ years of experience with data-related technologies
• Good understanding of distributed computing principles
• Experience with Apache Spark
• Hands-on programming with Python
• Knowledge of Hadoop v2, MapReduce, HDFS
• Experience building stream-processing systems using technologies such as Apache Storm, Spark Streaming, or Flink
• Experience with messaging systems such as Kafka or RabbitMQ
• Good understanding of big data querying tools such as Hive
• Experience integrating data from multiple data sources
• Good understanding of SQL queries, joins, stored procedures, and relational schemas
• Experience with NoSQL databases such as HBase, Cassandra/ScyllaDB, MongoDB
• Knowledge of ETL techniques and frameworks
• Performance tuning of Spark jobs
• General understanding of data quality is a plus
• Experience with Databricks, Snowflake, and BigQuery or similar lakehouses would be a big plus
• Some knowledge of DevOps is nice to have
Reality Premedia Services Pvt Ltd
Rinky kamble
Posted by Rinky kamble
Pune
3 - 5 yrs
₹5L - ₹8L / yr
Tableau
Power BI
Data Analyst
Data Analytics
We at Reality Premedia Pvt Ltd are looking for a Data Analyst for our Pune location.
Must have experience in the BFSI domain.

Experience: minimum 3 years
Location: Pune

Mandatory skills: experience in Power BI/Tableau, SQL, and basic Python
Bungee Tech India
Abigail David
Posted by Abigail David
Remote, NCR (Delhi | Gurgaon | Noida), Chennai
5 - 10 yrs
₹10L - ₹30L / yr
Big Data
Hadoop
Apache Hive
Spark
ETL
+3 more

Company Description

At Bungee Tech, we help retailers and brands meet customers everywhere and on every occasion they are in. We believe that accurate, high-quality data matched with compelling market insights empowers retailers and brands to keep their customers at the center of all the innovation and value they are delivering.

We provide retailers and brands with a clear and complete omnichannel picture of their competitive landscape. We collect billions of data points every day, multiple times a day, from publicly available sources. Using high-quality extraction, we uncover detailed information on products or services, which we automatically match and then proactively track for price, promotion, and availability. Plus, anything we do not match helps to identify a new assortment opportunity.

Empowered with this unrivalled intelligence, we unlock compelling analytics and insights that, once blended with verified partner data from trusted sources such as Nielsen, paint a complete, consolidated picture of the competitive landscape.

We are looking for a Big Data Engineer who will work on collecting, storing, processing, and analyzing huge sets of data. The primary focus will be on choosing optimal solutions for these purposes, then maintaining, implementing, and monitoring them. You will also be responsible for integrating them with the architecture used in the company.

We're working on the future. If you are seeking an environment where you can drive innovation, apply state-of-the-art software technologies to solve real-world problems, and enjoy the satisfaction of providing visible benefit to end users in an iterative, fast-paced environment, this is your opportunity.

 

Responsibilities

As an experienced member of the team, in this role you will:
• Contribute to evolving the technical direction of analytical systems and play a critical role in their design and development
• Research, design, code, troubleshoot, and support; what you create is also what you own
• Develop the next generation of automation tools for monitoring and measuring data quality, with associated user interfaces
• Broaden your technical skills and work in an environment that thrives on creativity, efficient execution, and product innovation

BASIC QUALIFICATIONS

• Bachelor’s degree or higher in an analytical area such as Computer Science, Physics, Mathematics, Statistics, Engineering, or similar
• 5+ years of relevant professional experience in data engineering and business intelligence
• 5+ years with advanced SQL (analytical functions), ETL, and data warehousing
• Strong knowledge of data warehousing concepts, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments, data structures, data modeling, and performance tuning
• Ability to effectively communicate with both business and technical teams
• Excellent coding skills in Java, Python, C++, or an equivalent object-oriented programming language
• Understanding of relational and non-relational databases and basic SQL
• Proficiency with at least one of these scripting languages: Perl, Python, Ruby, or shell script

PREFERRED QUALIFICATIONS

• Experience building data pipelines from application databases
• Experience with AWS services: S3, Redshift, Spectrum, EMR, Glue, Athena, ELK, etc.
• Experience working with data lakes
• Experience providing technical leadership and mentoring other engineers on best practices in the data engineering space
• Sharp problem-solving skills and the ability to resolve ambiguous requirements
• Experience working with big data
• Knowledge of and experience working with Hive and the Hadoop ecosystem
• Knowledge of Spark
• Experience working with data science teams
Ayoconnect Technology
4 recruiters
Arunasri Dhavala
Posted by Arunasri Dhavala
Remote only
2 - 5 yrs
₹4L - ₹10L / yr
Data Analytics
Data Visualization
Data Analysis
Data validation
Tableau
• Responsible for gathering, crunching, and collecting raw data
• Own all data needs in the growth team and across key cross-functional initiatives, from data extraction to dashboard creation and data analysis
• Provide insights and recommended actions in response to the current situation and trends, as well as preventive ones for better preparation
• Collaborate with data engineering and cross-functional stakeholders to define data requirements and create dashboards to drive business decisions and optimize business outcomes
• Manage regular reporting and tracking
• Deliver analysis, insights, reporting, data marts, and tools to support the business team
• Build and maintain data marts and dashboards for tracking business OKRs and initiatives
• Ingest both internal and external data to support business needs

Qualification:

• Bachelor's degree in Engineering, Mathematics, Statistics, Operations Research, or other related disciplines
• 2-5 years of experience is an advantage, but fresh graduates are welcome to apply as well
• Expert in spreadsheets and SQL, with strong experience in data visualization and reporting tools (e.g., Tableau, Google Data Studio)
• Comfortable working independently and collaboratively with minimal guidance



MNC
Agency job
via Fragma Data Systems by Geeti Gaurav Mohanty
Bengaluru (Bangalore)
3 - 5 yrs
₹6L - ₹12L / yr
Spark
Big Data
Data engineering
Hadoop
Apache Kafka
+5 more
Data Engineer

• Drive the data engineering implementation
• Strong experience in building data pipelines
• AWS stack experience is a must
• Deliver conceptual, logical, and physical data models for the implementation teams
• Strong SQL is a must: advanced SQL working knowledge and experience working with a variety of relational databases, including SQL query authoring
• AWS cloud data pipeline experience is a must: data pipelines and data-centric applications using distributed storage platforms like S3 and distributed processing platforms like Spark, Airflow, Kafka
• Working knowledge of AWS technologies such as S3, EC2, EMR, RDS, Lambda, Elasticsearch
• Ability to use a major programming language (e.g., Python/Java) to process data for modelling
Leading digital marketing agency
Agency job
via Talent Socio Bizcon LLP by Vidushi Singh
Ukraine
3 - 10 yrs
₹15L - ₹30L / yr
Big Data
Elasticsearch
Hadoop
Apache Kafka
Apache Hive
Responsibilities:
• Studied Computer Science
• 5+ years of software development experience
• Must have experience in Elasticsearch (2+ years of experience is preferable)
• Skills in Java, Python, or Scala
• Passionate about learning big data, data mining, and data analysis technologies
• Self-motivated; independent, organized, and proactive; highly responsive, flexible, and adaptable when working across multiple teams
• Strong SQL skills, including query optimization, are required
• Experience working with large, complex datasets is required
• Experience with recommendation systems and data warehouse technologies is preferred
• An intense curiosity about data and a strong commitment to practical problem-solving
• Creative in thinking about data-centric products that will be used in online customer behavior and marketing
• Build systems to pull meaningful insights from our data platform
• Integrate our analytics platform internally across products and teams
• Focus on performance, throughput, and latency, and drive these throughout our architecture

Bonuses:
• Experience with big data architectures such as the Lambda Architecture
• Experience working with big data technologies (like Hadoop, Java MapReduce, Hive, Spark SQL) and real-time processing frameworks (like Spark Streaming, Storm, AWS Kinesis)
• Proficiency in key-value stores such as HBase/Cassandra, Redis, Riak, and MongoDB
• Experience with AWS EMR
Saama Technologies
6 recruiters
Vaibhav Karpe
Posted by Vaibhav Karpe
Pune
4 - 6 yrs
₹7L - ₹10L / yr
Informatica PowerCenter
MySQL
Data Analytics
Job Description

Individuals should have:
• Good hands-on experience in ETL development with tools like Informatica
• Good experience in Informatica ETL and workflow development
• Good experience writing complex SQL and doing data analysis
• Experience with Oracle SQL and PL/SQL
• Experience with stored procedures
• Good SQL experience and knowledge
• Good experience with UNIX commands
• Understanding of at least one programming language
• Understanding of cloud-based ETL using Redshift
• Understanding of the Healthcare/Life Science domain is a big plus
FuGenX Technologies
1 video
3 recruiters
Aradya S
Posted by Aradya S
Hyderabad, Bengaluru (Bangalore)
1 - 6 yrs
₹1.3L - ₹6.3L / yr
Python
MongoDB
Machine Learning (ML)
Natural Language Processing (NLP)
Artificial Intelligence (AI)
+2 more
• 3 or more years of experience in Python programming
• Good understanding of object-oriented programming concepts
• Working knowledge of Flask and MongoDB
• At least 2-3 years of experience in machine learning and NLP
• Proficient with machine learning and deep learning libraries like Pandas, NumPy, scikit-learn, TensorFlow, and Keras
• Expertise in deep learning (neural networks) would be an advantage
• Proficient with a natural language processing library like NLTK or spaCy
• Hands-on technical experience with artificial intelligence using open-source programming languages and analytics software
• Experience using data science techniques such as web scraping, text mining, natural language processing, machine learning, statistical modelling, or image recognition
• Data visualisation skills, e.g. in Tableau or QlikView, would be an advantage
Directi
13 recruiters
Richa Pancholy
Posted by Richa Pancholy
Bengaluru (Bangalore)
2 - 8 yrs
₹10L - ₹40L / yr
Amazon Web Services (AWS)
Python
Linux/Unix
DevOps
MongoDB
+8 more
What is the job like?

We are looking for a talented individual to join our DevOps and Platforms Engineering team. You will play an important role in helping build and run our globally distributed infrastructure stack and platforms. Technologies you can expect to work on every day include Linux, AWS, MySQL/PostgreSQL, MongoDB, Hadoop/HBase, Elasticsearch, FreeSWITCH, Jenkins, Nagios, and CFEngine, amongst others.

Responsibilities:
• Troubleshoot and fix production outages and performance issues in our AWS/Linux infrastructure stack
• Build automation tools for provisioning and managing our cloud infrastructure by leveraging the AWS API for EC2, S3, CloudFront, RDS, and Route 53, amongst others
• Contribute to enhancing and managing our continuous delivery pipeline
• Proactively seek out opportunities to improve monitoring and alerting of our hosts and services, and implement them in a timely fashion
• Code scripts and tools to collect and visualize metrics from Linux hosts and JVM applications
• Enhance and maintain our log collection, processing, and visualization infrastructure
• Automate systems configuration by writing policies and modules for configuration management tools
• Write both frontend (HTML/CSS/JS) and backend code (Python, Ruby, Perl)
• Participate in periodic on-call rotations for DevOps

Skills:
• DevOps/system admin experience ranging between 3-4 years
• In-depth Linux/Unix knowledge and a good understanding of the various Linux kernel subsystems (memory, storage, network, etc.)
• DNS, TCP/IP, routing, HA, and load balancing
• Configuration management using tools like CFEngine, Puppet, or Chef
• SQL and NoSQL databases like MySQL, PostgreSQL, MongoDB, and HBase
• Build and packaging tools like Jenkins and RPM/Yum
• HA and load balancing using tools like the Elastic Load Balancer and HAProxy
• Monitoring tools like Nagios, Pingdom, or similar
• Log management tools like Logstash, Fluentd, syslog, Elasticsearch, or similar
• Metrics collection tools like Ganglia, Graphite, OpenTSDB, or similar
• Programming in a high-level language like Python or Ruby
Why apply to jobs via Cutshort
Personalized job matches
Stop wasting time. Get matched with jobs that meet your skills, aspirations and preferences.
Verified hiring teams
See actual hiring teams, find common social connections or connect with them directly.
Move faster with AI
We use AI to get you faster responses, recommendations and unmatched user experience.
Did not find a job you were looking for?
Search for relevant jobs from 10000+ companies such as Google, Amazon & Uber actively hiring on Cutshort.
Get to hear about interesting companies hiring right now
Follow Cutshort on LinkedIn